The More Cooperation, the More Competition? A Cournot Analysis of the Benefits of Electric Market Coupling
Market coupling in the Belgian and Dutch markets would permit more efficient use of intercountry transmission, 1) by counting only net flows against transmission limits, 2) by improving access to the Belgian market, and 3) by eliminating the mismatch in timing between interface auctions and the energy spot market. A Cournot market model that accounts for the region's transmission pricing rules and limitations is used to simulate market outcomes with and without market coupling; the model captures benefits 1) and 2). Market coupling improves social surplus on the order of 10^8 €/year, unless it encourages the largest producer in the region to switch from a price-taking strategy in Belgium to a Cournot strategy due to a perceived diminishment of the threat of regulatory intervention. The benefit to Dutch consumers depends on the behavior of this company. The results illustrate how large-scale oligopoly models can be useful for assessing market integration.
Keywords: Electric power, Electric transmission, Liberalization, Oligopoly, Complementarity models, Computational models, Netherlands, Belgium, France, Germany, Market Coupling
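As a hedged illustration of the modelling approach, the sketch below shows how a Cournot equilibrium is computed for a minimal two-firm market by iterating best responses. It is not the paper's model, which adds the region's transmission pricing rules and limits; the demand and cost parameters are hypothetical.

```python
# Minimal Cournot duopoly sketch (hypothetical parameters; not the paper's
# Belgian-Dutch network model). Inverse demand P(Q) = a - b*Q, constant
# marginal costs; the equilibrium is found by iterating best responses.

a, b = 100.0, 1.0         # demand intercept and slope (illustrative)
costs = [10.0, 20.0]      # marginal costs of the two firms (illustrative)

def best_response(c_i, q_other):
    """argmax_q (a - b*(q + q_other) - c_i) * q  =  (a - c_i - b*q_other) / (2b)."""
    return max(0.0, (a - c_i - b * q_other) / (2.0 * b))

q = [0.0, 0.0]
for _ in range(100):      # fixed-point iteration; this map is a contraction
    q = [best_response(costs[0], q[1]),
         best_response(costs[1], q[0])]

price = a - b * sum(q)
print(f"Cournot outputs: {q[0]:.2f}, {q[1]:.2f}; market price: {price:.2f}")
```

Swapping one firm's best response for price-taking behaviour (produce until marginal cost equals price) reproduces, in miniature, the strategic-versus-price-taking comparison the abstract discusses.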
Planning electricity transmission to accommodate renewables: Using two-stage programming to evaluate flexibility and the cost of disregarding uncertainty
We develop a stochastic two-stage optimisation model that captures the multistage nature of electricity transmission planning under uncertainty and apply it to a stylised representation of the Great Britain (GB) network. In our model, a proactive transmission planner makes investment decisions in two time periods, each time followed by a market response. This model allows us to identify robust first-stage investments and estimate the value of information in transmission planning, the costs of ignoring uncertainty, and the value of flexibility. Our results show that ignoring risk has quantifiable economic consequences, and that considering uncertainty explicitly can yield decisions with lower expected costs than those of traditional deterministic planning methods. Furthermore, the best plan under a risk-neutral criterion can differ from the best plan under risk aversion.
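To make the "cost of disregarding uncertainty" concrete, here is a minimal scalar sketch of a two-stage stochastic investment problem and the value of the stochastic solution (VSS). It assumes a single capacity decision with a linear shortfall penalty, not the paper's GB network model, and all numbers are hypothetical.

```python
# Toy two-stage problem illustrating the "cost of disregarding uncertainty"
# (the value of the stochastic solution, VSS). Hypothetical numbers; the
# paper's model is a network investment problem, not this scalar example.

scenarios = [(0.3, 80.0), (0.4, 100.0), (0.3, 140.0)]  # (probability, demand)
c_invest, c_shortfall = 5.0, 20.0   # cost per unit of capacity / per unit unserved

def expected_cost(x):
    """First-stage investment cost plus expected second-stage shortfall cost."""
    return c_invest * x + sum(p * c_shortfall * max(d - x, 0.0)
                              for p, d in scenarios)

# Stochastic solution: minimise expected cost over a grid of capacities.
grid = [i * 1.0 for i in range(0, 201)]
x_sp = min(grid, key=expected_cost)

# Deterministic ("expected value") solution: plan for the mean demand only.
x_ev = sum(p * d for p, d in scenarios)

vss = expected_cost(x_ev) - expected_cost(x_sp)
print(f"stochastic plan: {x_sp:.0f}, deterministic plan: {x_ev:.0f}, "
      f"expected cost of ignoring uncertainty (VSS): {vss:.1f}")
```

The deterministic plan sized for mean demand leaves the high-demand scenario exposed, so its expected cost exceeds that of the stochastic plan; the gap is the VSS the abstract refers to as the cost of disregarding uncertainty.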
Upstream vs. Downstream CO2 Trading: A Comparison for the Electricity Context
In electricity, “downstream” CO2 regulation requires retail suppliers to buy energy from a mix of sources so that their weighted emissions satisfy a standard. It has been argued that such “load-based” regulation would prevent emissions leakage, cost consumers less, and provide more incentive for energy efficiency than traditional source-based cap-and-trade programs. Because pure load-based trading complicates spot power markets, variants (GEAC and CO2RC) that separate emissions attributes from energy have been proposed. When all energy producers and consumers come under such a system, these load-based programs are equivalent to source-based trading in which emissions allowances are allocated by various rules, and they have no necessary cost advantage. The GEAC and CO2RC systems are equivalent to giving allowances free to generators and requiring consumers either to subsidize generation or to buy back excess allowances, respectively. As avoided energy costs under source-based and pure load-based trading are equal, the latter provides no additional incentive for energy efficiency. The speculative benefits of load-based systems are unjustified in light of their additional administrative complexity and cost, the threat they pose to the competitiveness and efficiency of electricity spot markets, and the complications that would arise in a transition to a federal cap-and-trade system.
Opportunity Cost Bidding by Wind Generators in Forward Markets: Analytical Results
Wind generation must trade in forward electricity markets based on imperfect forecasts of its output and of real-time prices. When the real-time price differs for generators that are short and long, the optimal forward strategy must be based on the opportunity costs of real-time charges and payments rather than on a central estimate of wind output. We present analytical results for wind's optimal forward strategy. In the risk-neutral case, the optimal strategy is determined by the distribution of real-time available wind capacity and the expected real-time prices conditioned on the forward price and wind out-turn; our approach is simpler and more computationally efficient than formulations requiring specification of full joint distributions or a large set of scenarios. Informative closed-form examples are derived for particular specifications of the wind-price dependence structure. In the usual case of uncertain forward prices, the optimal bidding strategy generally consists of a bid curve for wind power rather than a fixed quantity bid. A discussion of the risk-averse problem is also provided. An analytical result is available for aversion to production volume risk; however, we doubt whether wind owners should be risk-averse with respect to the income from a single settlement period, given the large number of such periods in a year.
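The risk-neutral result has a newsvendor flavour: the optimal forward quantity is a critical fractile of the distribution of available wind. The sketch below assumes, more strongly than the paper, that the expected real-time prices for being short and long are known constants rather than conditional expectations; the Beta-shaped wind distribution and all numbers are hypothetical.

```python
# Sketch of the risk-neutral optimal forward bid for a wind generator as a
# newsvendor-style quantile of the wind distribution. Simplifying assumption
# (stronger than the paper's): the expected real-time prices for being short
# and long are known constants. All numbers are hypothetical.

import random

capacity = 100.0                        # MW (illustrative)
f, p_short, p_long = 60.0, 80.0, 40.0   # forward, short and long RT prices (EUR/MWh)

# Expected profit f*q + E[p_long*(w-q)+] - E[p_short*(q-w)+] is maximised where
# F(q*) = (f - p_long) / (p_short - p_long), with F the CDF of available wind w.
fractile = (f - p_long) / (p_short - p_long)

# Empirical quantile from a Monte Carlo sample of available wind (a Beta-shaped
# capacity factor is a common modelling choice, not taken from the paper).
random.seed(1)
sample = sorted(capacity * random.betavariate(2.0, 3.0) for _ in range(100_000))
q_star = sample[int(fractile * (len(sample) - 1))]

print(f"critical fractile: {fractile:.2f}; optimal forward bid: {q_star:.1f} MW")
```

Re-solving for a range of forward prices f traces out the bid curve that the abstract describes for the case of uncertain forward prices.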
Network-constrained models of liberalized electricity markets: the devil is in the details
Numerical models of electricity markets are frequently used to inform and support decisions. How robust are the results? Three research groups used the same realistic data set for generators, demand and the transmission network as input to their numerical models. The results coincide when predicting competitive market outcomes. In the strategic case, in which large generators can exercise market power, the predicted prices differed significantly: the results are highly sensitive to assumptions about market design, the timing of the market, and constraints on the rationality of generators. Given the same assumptions, the results coincide. We provide a checklist to help users understand the implications of different modelling assumptions.
Keywords: Market power, Electricity, Networks, Numeric models, Model comparison
The scientific potential of space-based gravitational wave detectors
The millihertz gravitational wave band can only be accessed with a space-based interferometer, but it is one of the richest in potential sources. Observations in this band have amazing scientific potential. The mergers between massive black holes with mass in the range 10 thousand to 10 million solar masses, which are expected to occur following the mergers of their host galaxies, produce strong millihertz gravitational radiation. Observations of these systems will trace the hierarchical assembly of structure in the Universe in a mass range that is very difficult to probe electromagnetically. Stellar mass compact objects falling into such black holes in the centres of galaxies generate detectable gravitational radiation for several years prior to the final plunge and merger with the central black hole. Measurements of these systems offer an unprecedented opportunity to probe the predictions of general relativity in the strong-field and dynamical regime. Millihertz gravitational waves are also generated by millions of ultra-compact binaries in the Milky Way, providing a new way to probe galactic stellar populations. ESA has recognised this great scientific potential by selecting The Gravitational Universe as its theme for the L3 large satellite mission, scheduled for launch in ~2034. In this article we review the likely sources for millihertz gravitational wave detectors and describe the wide applications that observations of these sources could have for astrophysics, cosmology and fundamental physics.
Comment: 18 pages, 2 figures, contribution to Gravitational Wave Astrophysics, the proceedings of the 2014 Sant Cugat Forum on Astrophysics; v2 includes one additional reference
State of the climate in 2013
In 2013, the vast majority of the monitored climate variables reported here maintained trends established in recent decades. ENSO was in a neutral state during the entire year, remaining mostly on the cool side of neutral with modest impacts on regional weather patterns around the world. This follows several years dominated by the effects of either La Niña or El Niño events. According to several independent analyses, 2013 was again among the 10 warmest years on record at the global scale, both at the Earth's surface and through the troposphere. Some regions in the Southern Hemisphere had record or near-record high temperatures for the year. Australia observed its hottest year on record, while Argentina and New Zealand reported their second and third hottest years, respectively. In Antarctica, Amundsen-Scott South Pole Station reported its highest annual temperature since records began in 1957. At the opposite pole, the Arctic observed its seventh warmest year since records began in the early 20th century. At 20-m depth, record high temperatures were measured at some permafrost stations on the North Slope of Alaska and in the Brooks Range. In the Northern Hemisphere extratropics, anomalous meridional atmospheric circulation occurred throughout much of the year, leading to marked regional extremes of both temperature and precipitation. Cold temperature anomalies during winter across Eurasia were followed by warm spring temperature anomalies, which were linked to a new record low Eurasian snow cover extent in May. Minimum sea ice extent in the Arctic was the sixth lowest since satellite observations began in 1979. Including 2013, all seven lowest extents on record have occurred in the past seven years. Antarctica, on the other hand, had above-average sea ice extent throughout 2013, with 116 days of new daily high extent records, including a new daily maximum sea ice area of 19.57 million km² reached on 1 October. ENSO-neutral conditions in the eastern central Pacific Ocean and a negative Pacific decadal oscillation pattern in the North Pacific had the largest impacts on the global sea surface temperature in 2013. The North Pacific reached a historic high temperature in 2013 and, on balance, the globally averaged sea surface temperature was among the 10 highest on record. Overall, the salt content in near-surface ocean waters increased while in intermediate waters it decreased. Global mean sea level continued to rise during 2013, on pace with a trend of 3.2 mm yr⁻¹ over the past two decades. A portion of this trend (0.5 mm yr⁻¹) has been attributed to natural variability associated with the Pacific decadal oscillation as well as to ongoing contributions from the melting of glaciers and ice sheets and ocean warming. Global tropical cyclone frequency during 2013 was slightly above average with a total of 94 storms, although the North Atlantic Basin had its quietest hurricane season since 1994. In the Western North Pacific Basin, Super Typhoon Haiyan, the deadliest tropical cyclone of 2013, had 1-minute sustained winds estimated to be 170 kt (87.5 m s⁻¹) on 7 November, the highest wind speed ever assigned to a tropical cyclone. High storm surge was also associated with Haiyan as it made landfall over the central Philippines, an area where sea level is currently at historic highs, increasing by 200 mm since 1970. In the atmosphere, carbon dioxide, methane, and nitrous oxide all continued to increase in 2013.
As in previous years, each of these major greenhouse gases once again reached historic high concentrations. In the Arctic, carbon dioxide and methane increased at the same rate as the global increase. These increases are likely due to export from lower latitudes rather than a consequence of increases in Arctic sources, such as thawing permafrost. At Mauna Loa, Hawaii, for the first time since measurements began in 1958, the daily average mixing ratio of carbon dioxide exceeded 400 ppm on 9 May. The state of these variables, along with dozens of others, and the 2013 climate conditions of regions around the world are discussed in further detail in this 24th edition of the State of the Climate series. © 2014, American Meteorological Society. All rights reserved.